A Appendix: Proofs and Algorithms

A.1 Proofs of results in Section 4

Proof of Proposition 4.1. Plug B … (Bertsekas, 1999). Algorithm 1. Furthermore, we call f̂(·), X … We can show that |f(·) − f̂(·)| ≤ …, ∀ · ∈ […, …]. Besides, computing the upper bound claimed in Proposition 4.2 requires finding … The second equality follows from the fact that the objective function is affine w.r.t. … Finally, we verify the remaining two components. This finishes the proof of our claim.
Scale-Aware Relay and Scale-Adaptive Loss for Tiny Object Detection in Aerial Images
Jinfu Li, Yuqi Huang, Hong Song, Ting Wang, Jianghan Xia, Yucong Lin, Jingfan Fan, Jian Yang
Despite remarkable recent advances in object detection, modern detectors still struggle to detect tiny objects in aerial images. One key reason is that tiny objects carry limited features, which are inevitably degraded or lost during long-distance propagation through the network. Another is that smaller objects receive disproportionately larger regression penalties than larger ones during training. To tackle these issues, we propose a Scale-Aware Relay Layer (SARL) and a Scale-Adaptive Loss (SAL) for tiny object detection, both of which are seamlessly compatible with top-performing frameworks. Specifically, SARL employs cross-scale spatial-channel attention to progressively enrich the meaningful features of each layer and strengthen cross-layer feature sharing. SAL reshapes vanilla IoU-based losses so as to dynamically assign lower weights to larger objects, focusing training on tiny objects while reducing the influence on large objects. Extensive experiments on three benchmarks (\textit{i.e.,} AI-TOD, DOTA-v2.0 and VisDrone2019) demonstrate that the proposed method boosts generalization by 5.5\% Average Precision (AP) when embedded in the YOLOv5 (anchor-based) and YOLOX (anchor-free) baselines. Moreover, it also achieves robust performance of 29.0\% AP on a real-world noisy dataset (\textit{i.e.,} AI-TOD-v2.0).
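The abstract describes SAL only at a high level: an IoU-based loss reweighted so that smaller ground-truth boxes receive larger weights. The sketch below illustrates that idea in plain Python; the inverse-square-root-area weighting, the `gamma` exponent, and the mean-normalization step are assumptions for illustration, not the paper's exact formulation.

```python
def iou(box_a, box_b):
    """Axis-aligned IoU of two (x1, y1, x2, y2) boxes."""
    ix = max(0.0, min(box_a[2], box_b[2]) - max(box_a[0], box_b[0]))
    iy = max(0.0, min(box_a[3], box_b[3]) - max(box_a[1], box_b[1]))
    inter = ix * iy
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def scale_adaptive_iou_loss(preds, targets, gamma=0.5):
    """Mean IoU loss with a weight that decreases with ground-truth box
    area, so tiny objects dominate the regression gradient.
    The inverse-sqrt-area weight and gamma are hypothetical choices."""
    weights = []
    for t in targets:
        area = (t[2] - t[0]) * (t[3] - t[1])
        weights.append(1.0 / (area ** 0.5 + 1.0) ** gamma)  # small box -> big weight
    mean_w = sum(weights) / len(weights)
    weights = [w / mean_w for w in weights]  # keep overall loss magnitude stable
    losses = [w * (1.0 - iou(p, t)) for w, p, t in zip(weights, preds, targets)]
    return sum(losses) / len(losses)
```

Under this weighting, a localization error on a 10×10 box incurs a larger loss than a proportionally identical error on a 100×100 box, which is the behavior the abstract attributes to SAL.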
Interval-valued fuzzy soft $\beta$-covering approximation spaces
Gorzalczany [16] introduced the notion of interval-valued fuzzy sets, where the membership degree of set elements lies within the interval [0,1]. Interval-valued fuzzy sets are adept at handling scenarios where precise probabilities of set membership are elusive, offering instead an interval within which such probabilities are constrained [16, 29].

Subsequently, soft sets and rough sets frequently inspire the exploration of theories related to soft covering-based rough sets [2, 3, 11, 13, 43], attaining substantial relevance in specific domains. However, in fuzzy environments, rough set theory demonstrates inherent limitations, as discussed in [42]. To overcome these challenges, Zhang and Zhan [42] integrated fuzzy sets, soft sets, and rough
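To make the interval-valued idea concrete, the toy example below represents a fuzzy set whose membership degrees are subintervals of [0, 1] rather than single numbers, together with the standard pointwise-max union and pointwise-min intersection; the elements and intervals are invented for illustration.

```python
# An interval-valued fuzzy set maps each element to a membership
# interval (lower, upper) contained in [0, 1], expressing uncertainty
# about the exact membership degree.
A = {"x1": (0.3, 0.6), "x2": (0.7, 0.9), "x3": (0.0, 0.2)}
B = {"x1": (0.5, 0.8), "x2": (0.1, 0.4), "x3": (0.1, 0.3)}

def iv_union(s, t):
    """Union: pointwise maximum of lower and upper bounds."""
    return {x: (max(s[x][0], t[x][0]), max(s[x][1], t[x][1])) for x in s}

def iv_intersection(s, t):
    """Intersection: pointwise minimum of lower and upper bounds."""
    return {x: (min(s[x][0], t[x][0]), min(s[x][1], t[x][1])) for x in s}
```

For example, the element `x1` belongs to the union of `A` and `B` with membership interval (0.5, 0.8), the bound-wise maximum of (0.3, 0.6) and (0.5, 0.8).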